
Fix MPS device support for Apple Silicon Macs#26

Open
btebedo wants to merge 1 commit into ubisoft:main from btebedo:main

Conversation


@btebedo btebedo commented Apr 9, 2026

Fix MPS device support for Apple Silicon Macs

Problem

StableDiffusion.setup() used a binary device check (cuda or cpu) that ignored Apple's MPS backend. On Apple Silicon, self.device resolved to cpu, while ComfyUI's model management moved the UNet to MPS. This caused a device mismatch — encode_text produced CPU-resident embeddings that the MPS-resident UNet rejected at the cross-attention layer:

RuntimeError: Tensor for argument input is on cpu but expected on mps
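The mismatch comes from the binary device check described above. A minimal sketch of the replacement three-way selection (a standalone helper for illustration; the actual patch inlines this in `StableDiffusion.setup()`):

```python
import torch

def pick_device() -> torch.device:
    """Three-way device check: CUDA first, then Apple's MPS, then CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    # getattr guard keeps this working on torch builds without an MPS backend.
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")
```

With this check, `self.device` agrees with the device ComfyUI's model management picks for the UNet on Apple Silicon, so text embeddings land where cross-attention expects them.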

Changes (stable_diffusion.py)

  1. Device detection now includes MPS — replaced the one-liner torch.device("cuda" if torch.cuda.is_available() else "cpu") with a three-way check: CUDA → MPS → CPU.
  2. Text encoder .to() captures return value — changed self.text_encoder.to(...) to self.text_encoder = self.text_encoder.to(...) so the reference always points to the materialized model, which matters for lazy/meta tensors on MPS.
  3. Guard in encode_text — added self.text_encoder = self.text_encoder.to(self.device) before inference to force weight materialization, preventing MPS failures from lazy-loaded embedding weights.
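The three changes can be sketched together as follows. This is a minimal stand-in, not the actual `stable_diffusion.py`: the class name, constructor, and `tokens` argument are hypothetical, and a plain `nn.Module` stands in for the real text encoder.

```python
import torch

class StableDiffusionSketch:
    """Hypothetical sketch of the patched setup()/encode_text() flow."""

    def __init__(self, text_encoder: torch.nn.Module):
        self.text_encoder = text_encoder
        # Change 1: three-way device detection (CUDA -> MPS -> CPU).
        mps = getattr(torch.backends, "mps", None)
        if torch.cuda.is_available():
            self.device = torch.device("cuda")
        elif mps is not None and mps.is_available():
            self.device = torch.device("mps")
        else:
            self.device = torch.device("cpu")

    def setup(self) -> None:
        # Change 2: capture the return value of .to(), so the reference
        # points at the materialized module (relevant for lazy/meta
        # tensors on MPS).
        self.text_encoder = self.text_encoder.to(self.device)

    def encode_text(self, tokens: torch.Tensor) -> torch.Tensor:
        # Change 3: guard before inference to force weight
        # materialization on the target device.
        self.text_encoder = self.text_encoder.to(self.device)
        with torch.no_grad():
            return self.text_encoder(tokens.to(self.device))
```

On a CUDA or CPU machine the device resolution and the redundant `.to()` calls are no-ops, which is why the patch is behavior-preserving outside MPS.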

No functional change on CUDA or CPU.

